Patents by Inventor Paul Michael Newman
Paul Michael Newman has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10436878
Abstract: A method of localizing portable apparatus (200) obtains: a stored experience data set comprising a set of connected nodes; captured location representation data provided by at least one sensor associated with the portable apparatus; and a current pose estimate of the portable apparatus within the environment. The pose estimate is used to select a candidate set of the nodes that contain a potential match for the captured location representation data. The pose estimate is used to obtain a set of paths from path memory data, each said path comprising a set of said nodes previously traversed in the environment under similar environmental/visual conditions. The set of paths is used to refine the candidate set. The captured location representation data and the refined candidate set of nodes are compared in order to identify a current pose of the portable apparatus within the environment.
Type: Grant
Filed: December 4, 2015
Date of Patent: October 8, 2019
Assignee: The Chancellor Masters and Scholars of The University of Oxford
Inventors: Winston Samuel Churchill, Paul Michael Newman, Christopher James Linegar
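As a rough illustration of the idea above, the sketch below walks through the three steps: select candidate nodes near the pose estimate, refine them using paths remembered from earlier traversals, and match the live sensor data against the survivors. The Node structure, the frozenset descriptor, and the set-overlap matcher are invented for illustration and are not the patented implementation.

```python
# Hypothetical sketch of the candidate-selection / path-memory refinement idea.
from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    position: tuple          # (x, y) pose of the node in the experience map
    descriptor: frozenset    # placeholder appearance descriptor (e.g. visual words)

def candidate_nodes(nodes, pose_estimate, radius=10.0):
    """Step 1: nodes whose stored pose lies near the current pose estimate."""
    px, py = pose_estimate
    return [n for n in nodes
            if (n.position[0] - px) ** 2 + (n.position[1] - py) ** 2 <= radius ** 2]

def refine_with_path_memory(candidates, paths):
    """Step 2: keep only candidates that lie on a previously traversed path,
    i.e. a path recorded under similar environmental/visual conditions."""
    remembered = {node_id for path in paths for node_id in path}
    refined = [n for n in candidates if n.node_id in remembered]
    return refined or candidates          # fall back to the full candidate set

def localize(nodes, paths, pose_estimate, live_descriptor):
    """Step 3: compare the live sensor data against the refined candidates."""
    refined = refine_with_path_memory(candidate_nodes(nodes, pose_estimate), paths)
    best = max(refined, key=lambda n: len(n.descriptor & live_descriptor))
    return best.node_id, best.position
```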
-
Patent number: 10325381
Abstract: A method of localizing portable apparatus (100) in an environment, the method comprising obtaining captured image data representing an image captured by an imaging device (106) associated with the portable apparatus, and obtaining mesh data representing a 3-dimensional textured mesh of at least part of the environment. The mesh data is processed to generate a plurality of synthetic images, each synthetic image being associated with a pose within the environment and being a simulation of an image that would be captured by the imaging device from that associated pose. The plurality of synthetic images is analyzed to find a said synthetic image similar to the captured image data, and an indication is provided of a pose of the portable apparatus within the environment based on the associated pose of the similar synthetic image.
Type: Grant
Filed: July 17, 2015
Date of Patent: June 18, 2019
Assignee: The Chancellor Masters and Scholars of The University of Oxford
Inventors: William Paul Maddern, Alexander Douglas Stewart, Paul Michael Newman, Geoffrey Michael Pascoe
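The brute-force sketch below illustrates the render-and-compare loop: synthesize an image of the mesh from each candidate pose and keep the pose whose rendering best matches the camera image. render_from_mesh is a stand-in for a real mesh renderer, and zero-mean normalized cross-correlation is just one plausible similarity measure; the patent does not prescribe either.

```python
# Illustrative sketch only: pick the pose whose synthetic view best matches the camera image.
import numpy as np

def zncc(a, b):
    """Zero-mean normalized cross-correlation between two grayscale images."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def localize_against_mesh(captured, candidate_poses, render_from_mesh):
    """Render a synthetic image for each candidate pose and keep the best match."""
    scores = [(zncc(captured, render_from_mesh(pose)), pose) for pose in candidate_poses]
    best_score, best_pose = max(scores, key=lambda s: s[0])
    return best_pose, best_score
```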
-
Publication number: 20180342080
Abstract: A method of localising portable apparatus (100) in an environment, the method comprising obtaining captured image data representing an image captured by an imaging device (106) associated with the portable apparatus, and obtaining mesh data representing a 3-dimensional textured mesh of at least part of the environment. The mesh data is processed to generate a plurality of synthetic images, each synthetic image being associated with a pose within the environment and being a simulation of an image that would be captured by the imaging device from that associated pose. The plurality of synthetic images is analysed to find a said synthetic image similar to the captured image data, and an indication is provided of a pose of the portable apparatus within the environment based on the associated pose of the similar synthetic image.
Type: Application
Filed: July 17, 2015
Publication date: November 29, 2018
Applicant: The Chancellor Masters and Scholars of the University of Oxford
Inventors: William Paul Maddern, Alexander Douglas Stewart, Paul Michael Newman, Geoffrey Michael Pascoe
-
Patent number: 10109104
Abstract: Generating a 3D reconstruction of an environment around a monitoring-unit as that monitoring-unit is moved through the environment: a) providing at least a camera and a LIDAR sensor, each being controlled by independent clocks; b) using the camera to determine the trajectory of the monitoring-unit and determining a first time series using the clock of the camera, where the first time series details when the monitoring-unit was at predetermined points of the trajectory; c) recording the returns from the LIDAR sensor and determining a second time series using the clock of the LIDAR sensor, where the second time series details when each scan from the LIDAR was taken; d) using a timer to relate the first and second series in order to match the return from the LIDAR sensor to the point on the trajectory at which the return was received; and e) creating the 3D reconstruction based upon the LIDAR returns using information from the two time series.
Type: Grant
Filed: February 21, 2014
Date of Patent: October 23, 2018
Assignee: Oxford University Innovation Limited
Inventors: Paul Michael Newman, Ian Baldwin
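A minimal sketch of steps (d) and (e): map the LIDAR timestamps into the camera timebase, interpolate the camera-derived trajectory at those times, and drop each scan into the world at the interpolated pose. The constant clock offset, the 2D translation-only poses, and the linear interpolation are simplifying assumptions for illustration only.

```python
# Sketch of matching LIDAR returns to trajectory points across two independent clocks.
import numpy as np

def lidar_poses(traj_times, traj_xy, scan_times, clock_offset):
    """traj_times/traj_xy: camera-clock time series of when the unit was where.
    scan_times: LIDAR-clock times of each scan. clock_offset maps LIDAR time
    into the camera timebase (e.g. obtained from a shared timer)."""
    t = np.asarray(scan_times) + clock_offset       # LIDAR time -> camera time
    x = np.interp(t, traj_times, traj_xy[:, 0])     # interpolate the trajectory
    y = np.interp(t, traj_times, traj_xy[:, 1])
    return np.stack([x, y], axis=1)                 # pose at each LIDAR scan

def reconstruct(scan_points_sensor, poses):
    """Step (e): place each scan's points into the world frame (translation only
    here; a full version would also apply the interpolated orientation)."""
    return np.concatenate([pts + pose for pts, pose in zip(scan_points_sensor, poses)])
```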
-
Patent number: 10070101
Abstract: Method and apparatus for localizing transportable apparatus (100) within an environment (120) includes obtaining (202) point cloud data (114) representing a 3D point cloud with appearance information of at least part of the environment. The method further obtains (204) first frame data (110) representing an image produced by a sensor (102) onboard the transportable apparatus at a first time and location within the environment and second frame data (110) representing an image produced by the sensor at a second time and location within the environment. The method harmonizes (206) information about the first frame data, the second frame data and an overlapping subset of the point cloud data in order to determine a location within the point cloud data where at least one of the first frame and the second frame was produced, thereby localizing (208) the transportable apparatus within the environment.
Type: Grant
Filed: September 27, 2012
Date of Patent: September 4, 2018
Assignee: The Chancellor Masters and Scholars of The University of Oxford
Inventors: Paul Michael Newman, Alexander Douglas Stewart
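The abstract leaves the "harmonizing" step open, so the sketch below shows just one conceivable reading: score a candidate location by how well the appearance stored with the 3D points agrees with what the two camera frames saw when those points are projected into them, then keep the best-scoring candidate. The identity camera rotation, the sum-of-squared-differences cost, and the grid search are all assumptions made for brevity, not the patented method.

```python
# Heavily simplified sketch: appearance agreement between a point cloud and two frames.
import numpy as np

def project(points_xyz, pose_t, K):
    """Project world points into a camera at position pose_t (identity rotation
    assumed for brevity). Returns pixel coords and a mask of points in front."""
    cam = points_xyz - pose_t
    in_front = cam[:, 2] > 0.1
    uv = (K @ cam[in_front].T).T
    return uv[:, :2] / uv[:, 2:3], in_front

def appearance_cost(image, points_xyz, point_intensity, pose_t, K):
    """Sum of squared differences between stored point appearance and the image."""
    uv, mask = project(points_xyz, pose_t, K)
    h, w = image.shape
    u, v = uv[:, 0].astype(int), uv[:, 1].astype(int)
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    if not ok.any():
        return np.inf
    return float(((image[v[ok], u[ok]] - point_intensity[mask][ok]) ** 2).mean())

def localize(frames_and_offsets, points_xyz, point_intensity, candidate_ts, K):
    """Pick the candidate position minimizing the cost summed over both frames;
    each frame carries a known offset from the first frame (e.g. from odometry)."""
    def total(t):
        return sum(appearance_cost(img, points_xyz, point_intensity, t + off, K)
                   for img, off in frames_and_offsets)
    return min(candidate_ts, key=total)
```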
-
Patent number: 9945950
Abstract: A method of localizing transportable apparatus (200) within an environment includes receiving (402) data obtained from a first ranging sensor device (202) that is configured to collect information relating to a 2D representation of an environment (301) through which the transportable device is moving. Further data is received (404), that data being obtained from a second ranging sensor device (204) of the transportable apparatus configured to collect information relating to at least a surface (218) over which the transportable apparatus is moving. The ranging sensor device data is used (406) to estimate linear and rotational velocities of the transportable apparatus and the estimates are used (408) to generate a new 3D point cloud (212) of the environment. The method seeks to match (412) the new 3D point cloud with, or within, an existing 3D point cloud (216) in order to localize the transportable apparatus with respect to the existing point cloud.
Type: Grant
Filed: April 2, 2013
Date of Patent: April 17, 2018
Assignee: Oxford University Innovation Limited
Inventors: Paul Michael Newman, Ian Alan Baldwin
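The sketch below illustrates the middle of the pipeline under strong simplifications: turn per-interval velocity estimates (however obtained, e.g. from scan matching of the first sensor's 2D data) into a dead-reckoned trajectory, then stack the surface-facing scans into a new 3D point cloud ready for matching. The planar motion model and all names are illustrative assumptions, not the patented pipeline.

```python
# Sketch: dead-reckon from velocity estimates, then stack surface scans into a cloud.
import numpy as np

def integrate_motion(velocity_estimates, dt):
    """Turn per-interval (vx, vy, wz) velocity estimates into 2D poses (x, y, yaw)."""
    poses = [np.zeros(3)]
    for vx, vy, wz in velocity_estimates:
        x, y, th = poses[-1]
        poses.append(np.array([x + dt * (vx * np.cos(th) - vy * np.sin(th)),
                               y + dt * (vx * np.sin(th) + vy * np.cos(th)),
                               th + dt * wz]))
    return poses

def build_cloud(surface_scans, poses):
    """Place each surface-facing scan (N x 3, sensor frame) at its dead-reckoned pose."""
    cloud = []
    for pts, (x, y, th) in zip(surface_scans, poses):
        c, s = np.cos(th), np.sin(th)
        R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
        cloud.append(pts @ R.T + np.array([x, y, 0.0]))
    return np.concatenate(cloud)
```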
-
Patent number: 9875557
Abstract: A method and system for determining extrinsic calibration parameters for at least one pair of sensing devices mounted on transportable apparatus obtains (202) image data (110) representing images captured by an image generating sensing device (102) of the pair at a series of poses during motion through an environment (120) by transportable apparatus (100). The method obtains (202) data (112) representing a 3D point cloud based on data produced by a 2D LIDAR sensing device (104) of the pair. The method selects (204) an image captured by the image generating sensing device at a particular pose. The method generates (210) a laser reflectance image based on a portion of the point cloud corresponding to the pose. The method computes (212) a metric measuring alignment between the selected image and the corresponding laser reflectance image and uses (214) the metric to determine extrinsic calibration parameters for at least one of the sensing devices.
Type: Grant
Filed: October 30, 2013
Date of Patent: January 23, 2018
Assignee: The Chancellor Masters and Scholars of the University of Oxford
Inventors: Ashley Alan Napier, Paul Michael Newman
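As a toy version of the calibration loop: for each candidate extrinsic transform, project the reflectance-carrying point cloud into the camera to rasterize a laser reflectance image, score its alignment against the camera image, and keep the best candidate. Normalized correlation is used as a stand-in metric and grid search as a stand-in optimizer; the abstract commits to neither.

```python
# Toy sketch of reflectance-image alignment for camera-LIDAR extrinsic calibration.
import numpy as np

def reflectance_image(points, reflectance, R, t, K, shape):
    """Rasterize reflectance values into an image of the given (h, w) shape."""
    cam = points @ R.T + t
    ok = cam[:, 2] > 0.1
    uv = (K @ cam[ok].T).T
    u = (uv[:, 0] / uv[:, 2]).astype(int)
    v = (uv[:, 1] / uv[:, 2]).astype(int)
    img = np.zeros(shape)
    h, w = shape
    keep = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    img[v[keep], u[keep]] = reflectance[ok][keep]
    return img

def alignment(camera_image, refl_image):
    """Normalized correlation between the camera image and the reflectance image."""
    a = camera_image - camera_image.mean()
    b = refl_image - refl_image.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

def calibrate(camera_image, points, reflectance, candidate_extrinsics, K):
    """Grid search over candidate (R, t) pairs; a real system would optimize."""
    return max(candidate_extrinsics,
               key=lambda Rt: alignment(camera_image,
                                        reflectance_image(points, reflectance,
                                                          Rt[0], Rt[1], K,
                                                          camera_image.shape)))
```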
-
Patent number: 9864927
Abstract: A method of detecting the structural elements within a scene sensed by at least one sensor within a locale, the method comprising: a) capturing data from the sensor, which data provides a first representation of the sensed scene at the current time; b) generating a second representation of the sensed scene where the second representation is generated from a prior model of the locale; and c) comparing the first and second representations with one another to determine which parts of the first representation represent structural elements of the locale.
Type: Grant
Filed: January 20, 2014
Date of Patent: January 9, 2018
Assignee: Isis Innovation Limited
Inventors: Colin Alexander McManus, Paul Michael Newman, Benjamin Charles Davis
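The comparison in step (c) can be pictured as below: where the live representation agrees with the one synthesized from the prior model, the data is labelled as structure; where it disagrees, it is treated as something not in the prior model. The depth-image form and the fixed agreement threshold are illustrative assumptions.

```python
# Minimal sketch of labelling structure by comparing live data with a prior-model rendering.
import numpy as np

def label_structure(live_depth, prior_depth, threshold=0.3):
    """Return a boolean mask: True where the live scene matches the prior model."""
    valid = (live_depth > 0) & (prior_depth > 0)   # both the sensor and the model returned a value
    agrees = np.abs(live_depth - prior_depth) < threshold
    return valid & agrees

# Example: a live reading that matches the prior everywhere except one region
prior = np.full((4, 4), 5.0)
live = prior.copy()
live[1:3, 1:3] = 2.0                               # something new in front of the wall
print(label_structure(live, prior))                # False inside the 2x2 patch
```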
-
Publication number: 20170343643
Abstract: A method of localising portable apparatus (200) obtains: a stored experience data set comprising a set of connected nodes; captured location representation data provided by at least one sensor associated with the portable apparatus; and a current pose estimate of the portable apparatus within the environment. The pose estimate is used to select a candidate set of the nodes that contain a potential match for the captured location representation data. The pose estimate is used to obtain a set of paths from path memory data, each said path comprising a set of said nodes previously traversed in the environment under similar environmental/visual conditions. The set of paths is used to refine the candidate set. The captured location representation data and the refined candidate set of nodes are compared in order to identify a current pose of the portable apparatus within the environment.
Type: Application
Filed: December 4, 2015
Publication date: November 30, 2017
Applicant: BAE SYSTEMS plc
Inventors: Winston Samuel Churchill, Paul Michael Newman, Christopher James Linegar
-
Patent number: 9689687
Abstract: Navigation data is generated by receiving (502) a new experience data set (321) relating to a new experience capture. At least one stored experience data set (320) relating to at least one previous experience capture is also received (504). An experience data set includes a set of nodes, with each node comprising a series of visual image frames taken over a series of time frames. A candidate set of said nodes is obtained (506) from the stored experience data set that potentially matches a said node in the new experience data set, and then a check (508) is performed to see if a node in the candidate set matches the node in the new experience data set. If the result of the checking is positive then data relating to the matched nodes is added (510) to a place data set useable for navigation, the place data set indicating that said nodes in different said experience data sets relate to a same place.
Type: Grant
Filed: March 13, 2013
Date of Patent: June 27, 2017
Assignee: The Chancellor Masters and Scholars of The University of Oxford
Inventors: Paul Michael Newman, Winston Samuel Churchill
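A small sketch of the bookkeeping described above: when a node from the new experience is matched against a candidate node from a stored experience, both are recorded under a shared place entry. The dictionary-based node representation and the descriptor-overlap matcher are placeholders for illustration, not the patented matcher.

```python
# Sketch of building a place data set that links matched nodes across experiences.
from collections import defaultdict

def matches(node_a, node_b, min_overlap=0.5):
    """Placeholder check that two nodes' image-derived descriptors agree."""
    overlap = len(node_a["descriptor"] & node_b["descriptor"])
    return overlap >= min_overlap * min(len(node_a["descriptor"]), len(node_b["descriptor"]))

def update_place_data(place_data, new_experience, stored_experience, candidates):
    """place_data maps a place key to the (experience, node id) pairs seen there."""
    for new_id in new_experience:
        for cand_id in candidates.get(new_id, []):
            if matches(new_experience[new_id], stored_experience[cand_id]):
                place = place_data[("stored", cand_id)]
                place.add(("stored", cand_id))
                place.add(("new", new_id))
    return place_data

place_data = defaultdict(set)   # place key -> set of (experience, node id) pairs
```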
-
Publication number: 20170098128
Abstract: A method of localising a vehicle hosting a sensor comprises capturing data from the sensor providing a sensed scene around the vehicle at a current time, processing the sensed scene to extract a set of features from the sensed scene and determine a position of the vehicle from the sensed scene in relation to a previous position of the sensor, comparing the extracted set of features to one or more stored experiences, wherein each stored experience comprises a plurality of sets of features, where each set of features has been determined from a previously sensed scene, to ascertain whether the sensed scene can be recognised within any of the stored experiences, and if the extracted set of features of the sensed scene are recognised then that stored experience is used to produce an estimate of the position of the vehicle at the current time with respect to the stored experience.
Type: Application
Filed: December 14, 2016
Publication date: April 6, 2017
Inventors: Winston Churchill, Paul Michael Newman
-
Patent number: 9576206
Abstract: A method of localizing a vehicle hosting a sensor comprises capturing data from the sensor providing a sensed scene around the vehicle at a current time, processing the sensed scene to extract a set of features from the sensed scene and determine a position of the vehicle from the sensed scene in relation to a previous position of the sensor, comparing the extracted set of features to one or more stored experiences, wherein each stored experience comprises a plurality of sets of features, where each set of features has been determined from a previously sensed scene, to ascertain whether the sensed scene can be recognized within any of the stored experiences, and if the extracted set of features of the sensed scene are recognized then that stored experience is used to produce an estimate of the position of the vehicle at the current time with respect to the stored experience.
Type: Grant
Filed: February 8, 2013
Date of Patent: February 21, 2017
Assignee: Oxford University Innovation Limited
Inventors: Winston Churchill, Paul Michael Newman
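The control flow of this abstract can be summarized as below: run odometry against the previous frame, try to recognize the extracted features in each stored experience, and report a pose relative to whichever experience succeeds, otherwise fall back to dead reckoning. recognize(), the feature set, and the pose representation are hypothetical stand-ins for illustration.

```python
# Sketch of the experience-based localization control flow described in the abstract.

def localize(features, vo_pose, stored_experiences, recognize):
    """vo_pose: pose relative to the previous sensor position (from odometry).
    recognize(features, experience) -> pose in that experience, or None."""
    for experience in stored_experiences:
        pose_in_experience = recognize(features, experience)
        if pose_in_experience is not None:
            return {"localized": True,
                    "experience": experience["name"],
                    "pose": pose_in_experience}
    # No stored experience recognized the scene: keep dead-reckoning on odometry.
    return {"localized": False, "pose": vo_pose}
```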
-
Patent number: 9464894
Abstract: A method and apparatus are disclosed for localizing a vehicle along a route. The method comprises initially surveying the route to be traversed by the vehicle by creating a plurality of orthographic images of the route, at a plurality of locations along the route and subsequently passing the vehicle along the route while obtaining a current view of the route. The method further comprises resolving a pose of the vehicle relative to a survey trajectory by maximizing a relation between the orthographic images and the current view.
Type: Grant
Filed: September 27, 2012
Date of Patent: October 11, 2016
Assignee: BAE SYSTEMS plc
Inventors: Ashley Alan Napier, Paul Michael Newman
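One way to picture "maximizing a relation between the orthographic images and the current view" is the scan over the survey shown below, scoring each stored orthographic image against an orthographic rendering of the current view. The mean-absolute-difference score (minimized here) is an assumed stand-in for the relation used in the patent, and the discrete search over survey poses is a simplification.

```python
# Sketch: pick the survey pose whose stored orthographic image best explains the current view.
import numpy as np

def view_distance(current_ortho, survey_ortho):
    """Mean absolute intensity difference; smaller means a better match."""
    return float(np.abs(current_ortho - survey_ortho).mean())

def localize_on_route(current_ortho, survey):
    """survey: list of (pose_along_route, orthographic_image) from the prior run."""
    return min(survey, key=lambda entry: view_distance(current_ortho, entry[1]))[0]
```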
-
Patent number: 9297899
Abstract: A method of determining extrinsic calibration parameters for at least one sensor (102, 104, 106) mounted on transportable apparatus (100). The method includes receiving (202) data representing pose history of the transportable apparatus and receiving (202) sensor data from at least one sensor mounted on the transportable apparatus. The method generates (204) at least one point cloud using the sensor data received from the at least one sensor, each point in a said point cloud having a point covariance derived from the pose history data. The method then maximizes (206) a value of a quality function for the at least one point cloud, and uses (208) the maximized quality function to determine extrinsic calibration parameters for the at least one sensor.
Type: Grant
Filed: September 27, 2012
Date of Patent: March 29, 2016
Assignee: The Chancellor Masters and Scholars of the University of Oxford
Inventors: Paul Michael Newman, William Paul Maddern, Alastair Robin Harrison, Mark Christopher Sheehan
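The sketch below captures the shape of the optimization: rebuild the point cloud under each candidate extrinsic transform and keep the candidate that maximizes a cloud quality score. The negative mean nearest-neighbor distance used here is only an assumed proxy for the patent's quality function, and the per-point covariances from the pose history are ignored for brevity.

```python
# Sketch: choose extrinsics by maximizing a quality score of the assembled point cloud.
import numpy as np

def assemble(scans_sensor_frame, poses, R_ext, t_ext):
    """Apply candidate extrinsics, then each pose (R, t) from the pose history."""
    cloud = []
    for pts, (R_pose, t_pose) in zip(scans_sensor_frame, poses):
        in_body = pts @ R_ext.T + t_ext
        cloud.append(in_body @ R_pose.T + t_pose)
    return np.concatenate(cloud)

def crispness(cloud):
    """Higher when points from different scans land on the same surfaces.
    Brute-force O(N^2); fine for the small clouds of a sketch."""
    d2 = ((cloud[:, None, :] - cloud[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)                    # ignore self-distances
    return -float(np.sqrt(d2.min(axis=1)).mean())   # negative mean nearest-neighbor distance

def calibrate(scans, poses, candidate_extrinsics):
    return max(candidate_extrinsics,
               key=lambda Rt: crispness(assemble(scans, poses, Rt[0], Rt[1])))
```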
-
Publication number: 20150379766
Abstract: Generating a 3D reconstruction of an environment around a monitoring-unit as that monitoring-unit is moved through the environment: a) providing at least a camera and a LIDAR sensor, each being controlled by independent clocks; b) using the camera to determine the trajectory of the monitoring-unit and determining a first time series using the clock of the camera, where the first time series details when the monitoring-unit was at predetermined points of the trajectory; c) recording the returns from the LIDAR sensor and determining a second time series using the clock of the LIDAR sensor, where the second time series details when each scan from the LIDAR was taken; d) using a timer to relate the first and second series in order to match the return from the LIDAR sensor to the point on the trajectory at which the return was received; and e) creating the 3D reconstruction based upon the LIDAR returns using information from the two time series.
Type: Application
Filed: February 21, 2014
Publication date: December 31, 2015
Inventors: Paul Michael Newman, Ian Baldwin
-
Publication number: 20150356357
Abstract: A method of detecting the structural elements within a scene sensed by at least one sensor within a locale, the method comprising: a) capturing data from the sensor, which data provides a first representation of the sensed scene at the current time; b) generating a second representation of the sensed scene where the second representation is generated from a prior model of the locale; and c) comparing the first and second representations with one another to determine which parts of the first representation represent structural elements of the locale.
Type: Application
Filed: January 20, 2014
Publication date: December 10, 2015
Inventors: Colin Alexander McManus, Paul Michael Newman, Benjamin Guy Davis
-
Publication number: 20150331111
Abstract: A method of localising transportable apparatus (200) within an environment includes receiving (402) data obtained from a first ranging sensor device (202) that is configured to collect information relating to a 2D representation of an environment (301) through which the transportable device is moving. Further data is received (404), that data being obtained from a second ranging sensor device (204) of the transportable apparatus configured to collect information relating to at least a surface (218) over which the transportable apparatus is moving. The ranging sensor device data is used (406) to estimate linear and rotational velocities of the transportable apparatus and the estimates are used (408) to generate a new 3D point cloud (212) of the environment. The method seeks to match (412) the new 3D point cloud with, or within, an existing 3D point cloud (216) in order to localise the transportable apparatus with respect to the existing point cloud.
Type: Application
Filed: April 2, 2013
Publication date: November 19, 2015
Inventors: Paul Michael Newman, Ian Alan Baldwin
-
Publication number: 20150317781
Abstract: A method and system for determining extrinsic calibration parameters for at least one pair of sensing devices mounted on transportable apparatus obtains (202) image data (110) representing images captured by an image generating sensing device (102) of the pair at a series of poses during motion through an environment (120) by transportable apparatus (100). The method obtains (202) data (112) representing a 3D point cloud based on data produced by a 2D LIDAR sensing device (104) of the pair. The method selects (204) an image captured by the image generating sensing device at a particular pose. The method generates (210) a laser reflectance image based on a portion of the point cloud corresponding to the pose. The method computes (212) a metric measuring alignment between the selected image and the corresponding laser reflectance image and uses (214) the metric to determine extrinsic calibration parameters for at least one of the sensing devices.
Type: Application
Filed: October 30, 2013
Publication date: November 5, 2015
Inventors: Ashley Alan Napier, Paul Michael Newman
-
Patent number: 9170334
Abstract: A method of localizing transportable apparatus (100) within an environment including receiving (402) data obtained from a ranging sensor device (102) of the transportable apparatus configured to collect information relating to at least a surface (120) over which the transportable apparatus is moving in an environment, and using (404) the ranging sensor device data to generate a new 3D point cloud (110) of the environment. The method further obtains (406) data representing an existing 3D point cloud (114) of at least part of the environment, and seeks to match (408) the new 3D point cloud with, or within, the existing 3D point cloud in order to localize the transportable apparatus with respect to the existing point cloud.
Type: Grant
Filed: September 26, 2012
Date of Patent: October 27, 2015
Assignee: The Chancellor Masters and Scholars of the University of Oxford
Inventors: Ian Alan Baldwin, Paul Michael Newman
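The matching step at the end of this abstract is in the spirit of classical rigid registration; the sketch below is a minimal point-to-point ICP, with brute-force nearest neighbors and a fixed iteration count, offered as an illustration rather than the patented matcher.

```python
# Minimal point-to-point ICP sketch for matching a new cloud within an existing one.
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares R, t aligning src onto dst (equal-length correspondences)."""
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(new_cloud, existing_cloud, iterations=20):
    """Return R, t that localizes new_cloud within existing_cloud."""
    R, t = np.eye(3), np.zeros(3)
    moved = new_cloud.copy()
    for _ in range(iterations):
        d2 = ((moved[:, None, :] - existing_cloud[None, :, :]) ** 2).sum(-1)
        nn = existing_cloud[d2.argmin(axis=1)]          # closest existing point
        dR, dt = best_rigid_transform(moved, nn)
        moved = moved @ dR.T + dt
        R, t = dR @ R, dR @ t + dt
    return R, t
```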
-
Publication number: 20150057921
Abstract: Navigation data is generated by receiving (502) a new experience data set (321) relating to a new experience capture. At least one stored experience data set (320) relating to at least one previous experience capture is also received (504). An experience data set includes a set of nodes, with each node comprising a series of visual image frames taken over a series of time frames. A candidate set of said nodes is obtained (506) from the stored experience data set that potentially matches a said node in the new experience data set, and then a check (508) is performed to see if a node in the candidate set matches the node in the new experience data set. If the result of the checking is positive then data relating to the matched nodes is added (510) to a place data set useable for navigation, the place data set indicating that said nodes in different said experience data sets relate to a same place.
Type: Application
Filed: March 13, 2013
Publication date: February 26, 2015
Inventors: Paul Michael Newman, Winston Samuel Churchill