Patents by Inventor William Paul Maddern
William Paul Maddern has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Publication number: 20230244226
Abstract: According to one aspect, a method includes obtaining, at a monitoring arrangement, a first supervisory request from a first vehicle, the first supervisory request arranged to indicate that the first vehicle has identified a first potential issue. The method also includes processing, at the monitoring arrangement, the first supervisory request, wherein processing the first supervisory request includes determining whether the first potential issue is to be mitigated. When it is determined that the first potential issue is to be mitigated, information is provided from the monitoring arrangement to a control arrangement, and the control arrangement takes control of the first vehicle based on the information.
Type: Application
Filed: January 5, 2023
Publication date: August 3, 2023
Applicant: Nuro, Inc.
Inventors: Emily Anna Weslosky, Mark Rodrigo, Benjamin Caleb Carroll, Yichao (Roger) Shen, Eric Yuhao Yi, Nicholas Tang Moy, William Paul Maddern, Minlin Zhang
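The monitoring-and-mitigation flow described in this abstract could be sketched as follows. This is a minimal illustration, not code from the patent; all class names, the `severity` field, and the threshold rule are hypothetical assumptions.

```python
from dataclasses import dataclass

@dataclass
class SupervisoryRequest:
    """A vehicle's report that it has identified a potential issue (hypothetical shape)."""
    vehicle_id: str
    issue: str
    severity: int  # assumed scale: 1 (minor) .. 5 (critical)

class ControlArrangement:
    def take_control(self, vehicle_id, info):
        # Stand-in for the control arrangement taking control of the vehicle.
        return f"controlling {vehicle_id}: {info['issue']}"

class MonitoringArrangement:
    def __init__(self, control, threshold=3):
        self.control = control
        self.threshold = threshold  # hypothetical mitigation criterion

    def process(self, request):
        # Determine whether the reported potential issue is to be mitigated;
        # if so, provide information to the control arrangement.
        if request.severity >= self.threshold:
            info = {"vehicle": request.vehicle_id, "issue": request.issue}
            return self.control.take_control(request.vehicle_id, info)
        return None  # no mitigation needed
```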
-
Patent number: 11698634
Abstract: A system for providing remote assistance to an autonomous vehicle is disclosed herein. The system includes at least one remote assistance button configured to be selectively activated to initiate remote assistance for the autonomous vehicle. Each remote assistance button corresponds to a dedicated remote assistance function for the autonomous vehicle. For example, the system can include remote assistance buttons for causing the autonomous vehicle to stop, decelerate, or pull over. The system includes a controller configured to detect activation of the remote assistance button(s) and to cause remote assistance to be provided to the autonomous vehicle in response to the activation. For example, the autonomous vehicle may perform an action corresponding to the activated button with input assistance from the controller but without the system taking over control of the autonomous vehicle.
Type: Grant
Filed: November 11, 2020
Date of Patent: July 11, 2023
Assignee: Nuro, Inc.
Inventors: Emily Anna Weslosky, William Paul Maddern, Aleena Pan Byrne, Pratik Agarwal
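The button-to-dedicated-function mapping in this abstract might look like the sketch below. This is a hypothetical illustration, not the patented implementation; the console class, function names, and return strings are all assumptions.

```python
class RemoteAssistanceConsole:
    """Maps each remote assistance button to one dedicated function (hypothetical)."""

    def __init__(self):
        self._buttons = {}

    def register(self, name, fn):
        # Each button corresponds to exactly one dedicated assistance function.
        self._buttons[name] = fn

    def press(self, name, vehicle_id):
        # The vehicle performs the action itself with input assistance;
        # the console never takes over control of the vehicle.
        return self._buttons[name](vehicle_id)

# Hypothetical dedicated assistance functions.
def stop(vehicle_id):
    return f"{vehicle_id}: stopping"

def decelerate(vehicle_id):
    return f"{vehicle_id}: decelerating"

def pull_over(vehicle_id):
    return f"{vehicle_id}: pulling over"
```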
-
Patent number: 11481884
Abstract: Techniques for image quality enhancement for autonomous vehicle remote operations are disclosed herein. An image processing system of an autonomous vehicle can obtain images captured by at least two different cameras and stitch the images together to create a combined image. The image processing system can apply region blurring to a portion of the combined image to create an enhanced combined image, e.g., to blur regions/objects determined to be less important (or unimportant) for the remote operations. The image processing system can encode pixel areas of the enhanced combined image using a corresponding quality setting for respective pixel areas to create encoded image files, e.g., based on complexity levels of the respective pixel areas. The image processing system can transmit the encoded image files to a remote operations system associated with the autonomous vehicle for remote operations support.
Type: Grant
Filed: February 11, 2021
Date of Patent: October 25, 2022
Assignee: Nuro, Inc.
Inventors: Yichao (Roger) Shen, Alexandr Bakhturin, Chenjie Luo, Albert Meixner, Ian Zhou, Hubert Hua Kian Teo, William Paul Maddern
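The stitch / blur / quality-select pipeline from this abstract could be sketched on toy grayscale images (lists of pixel rows). This is a simplified illustration under stated assumptions, not the patented method: the mean-fill "blur", the complexity threshold, and the quality values are all hypothetical.

```python
def stitch(left, right):
    # Horizontally concatenate two images (rows of pixel intensities).
    return [lrow + rrow for lrow, rrow in zip(left, right)]

def blur_region(img, rows, cols):
    # Crude region blur: replace a rectangular region with its mean intensity,
    # standing in for blurring areas deemed less important for remote operations.
    vals = [img[r][c] for r in rows for c in cols]
    mean = sum(vals) // len(vals)
    out = [row[:] for row in img]
    for r in rows:
        for c in cols:
            out[r][c] = mean
    return out

def quality_for(complexity):
    # Pick an encoder quality setting per pixel area based on its complexity
    # (threshold and quality values are arbitrary for illustration).
    return 90 if complexity > 0.5 else 40
```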
-
Publication number: 20210383517
Abstract: Techniques for image quality enhancement for autonomous vehicle remote operations are disclosed herein. An image processing system of an autonomous vehicle can obtain images captured by at least two different cameras and stitch the images together to create a combined image. The image processing system can apply region blurring to a portion of the combined image to create an enhanced combined image, e.g., to blur regions/objects determined to be less important (or unimportant) for the remote operations. The image processing system can encode pixel areas of the enhanced combined image using a corresponding quality setting for respective pixel areas to create encoded image files, e.g., based on complexity levels of the respective pixel areas. The image processing system can transmit the encoded image files to a remote operations system associated with the autonomous vehicle for remote operations support.
Type: Application
Filed: February 11, 2021
Publication date: December 9, 2021
Inventors: Yichao (Roger) Shen, Alexandr Bakhturin, Chenjie Luo, Albert Meixner, Ian Zhou, Hubert Hua Kian Teo, William Paul Maddern
-
Publication number: 20210149389
Abstract: A system for providing remote assistance to an autonomous vehicle is disclosed herein. The system includes at least one remote assistance button configured to be selectively activated to initiate remote assistance for the autonomous vehicle. Each remote assistance button corresponds to a dedicated remote assistance function for the autonomous vehicle. For example, the system can include remote assistance buttons for causing the autonomous vehicle to stop, decelerate, or pull over. The system includes a controller configured to detect activation of the remote assistance button(s) and to cause remote assistance to be provided to the autonomous vehicle in response to the activation. For example, the autonomous vehicle may perform an action corresponding to the activated button with input assistance from the controller but without the system taking over control of the autonomous vehicle.
Type: Application
Filed: November 11, 2020
Publication date: May 20, 2021
Inventors: Emily Anna Weslosky, William Paul Maddern, Aleena Pan Byrne, Pratik Agarwal
-
Patent number: 10325381
Abstract: A method of localizing portable apparatus (100) in an environment, the method comprising obtaining captured image data representing an image captured by an imaging device (106) associated with the portable apparatus, and obtaining mesh data representing a 3-dimensional textured mesh of at least part of the environment. The mesh data is processed to generate a plurality of synthetic images, each synthetic image being associated with a pose within the environment and being a simulation of an image that would be captured by the imaging device from that associated pose. The plurality of synthetic images is analyzed to find a said synthetic image similar to the captured image data, and an indication is provided of a pose of the portable apparatus within the environment based on the associated pose of the similar synthetic image.
Type: Grant
Filed: July 17, 2015
Date of Patent: June 18, 2019
Assignee: The Chancellor Masters and Scholars of The University of Oxford
Inventors: William Paul Maddern, Alexander Douglas Stewart, Paul Michael Newman, Geoffrey Michael Pascoe
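The matching step of this localization method (compare the captured image against synthetic renders, then report the best-matching render's pose) could be sketched as below. This is a toy illustration, not the patented method: images are small lists of pixel rows, and sum-of-absolute-differences stands in for whatever similarity measure is actually used.

```python
def image_distance(a, b):
    # Sum of absolute pixel differences between two equally sized images
    # (a stand-in similarity measure; lower means more similar).
    return sum(abs(x - y)
               for row_a, row_b in zip(a, b)
               for x, y in zip(row_a, row_b))

def localize(captured, synthetic_by_pose):
    # Each synthetic image simulates what the camera would see from a
    # candidate pose; return the pose whose render best matches the capture.
    return min(synthetic_by_pose,
               key=lambda pose: image_distance(captured, synthetic_by_pose[pose]))
```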
-
Publication number: 20180342080
Abstract: A method of localising portable apparatus (100) in an environment, the method comprising obtaining captured image data representing an image captured by an imaging device (106) associated with the portable apparatus, and obtaining mesh data representing a 3-dimensional textured mesh of at least part of the environment. The mesh data is processed to generate a plurality of synthetic images, each synthetic image being associated with a pose within the environment and being a simulation of an image that would be captured by the imaging device from that associated pose. The plurality of synthetic images is analysed to find a said synthetic image similar to the captured image data, and an indication is provided of a pose of the portable apparatus within the environment based on the associated pose of the similar synthetic image.
Type: Application
Filed: July 17, 2015
Publication date: November 29, 2018
Applicant: The Chancellor Masters and Scholars of the University of Oxford
Inventors: William Paul Maddern, Alexander Douglas Stewart, Paul Michael Newman, Geoffrey Michael Pascoe
-
Patent number: 9297899
Abstract: A method of determining extrinsic calibration parameters for at least one sensor (102, 104, 106) mounted on transportable apparatus (100). The method includes receiving (202) data representing pose history of the transportable apparatus and receiving (202) sensor data from at least one sensor mounted on transportable apparatus. The method generates (204) at least one point cloud data using the sensor data received from the at least one sensor, each point in a said point cloud having a point covariance derived from the pose history data. The method then maximizes (206) a value of a quality function for the at least one point cloud, and uses (208) the maximized quality function to determine extrinsic calibration parameters for the at least one sensor.
Type: Grant
Filed: September 27, 2012
Date of Patent: March 29, 2016
Assignee: The Chancellor Masters and Scholars of the University of Oxford
Inventors: Paul Michael Newman, William Paul Maddern, Alastair Robin Harrison, Mark Christopher Sheehan
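The core idea of calibration-by-quality-maximization can be shown on a toy 2-D problem: search for the extrinsic offset of a second sensor that makes the merged point cloud "crispest". This is a deliberately simplified sketch, not the patented method; the negative-variance quality function, the grid search, and a single translational parameter are all assumptions standing in for the patent's covariance-weighted quality function and full extrinsic parameter set.

```python
def quality(points):
    # Toy quality function: a well-calibrated merged cloud is crisp,
    # so use negative spread (variance about the centroid) as quality.
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    return -sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in points) / n

def calibrate(sensor_a_pts, sensor_b_pts, candidate_offsets):
    # Grid-search the x-offset of sensor B that maximizes the quality
    # of the merged cloud, i.e. best aligns the two sensors' views.
    def merged(offset):
        shifted = [(x + offset, y) for x, y in sensor_b_pts]
        return sensor_a_pts + shifted
    return max(candidate_offsets, key=lambda off: quality(merged(off)))
```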
-
Publication number: 20140240690
Abstract: A method of determining extrinsic calibration parameters for at least one sensor (102, 104, 106) mounted on transportable apparatus (100). The method includes receiving (202) data representing pose history of the transportable apparatus and receiving (202) sensor data from at least one sensor mounted on transportable apparatus. The method generates (204) at least one point cloud data using the sensor data received from the at least one sensor, each point in a said point cloud having a point covariance derived from the pose history data. The method then maximises (206) a value of a quality function for the at least one point cloud, and uses (208) the maximised quality function to determine extrinsic calibration parameters for the at least one sensor.
Type: Application
Filed: September 27, 2012
Publication date: August 28, 2014
Applicant: The Chancellor Masters and Scholars of the University of Oxford
Inventors: Paul Michael Newman, William Paul Maddern, Alastair Robin Harrison, Mark Christopher Sheehan