Patents by Inventor Mark Robert Goodall
Mark Robert Goodall has filed for patents to protect the following inventions. This listing includes patent applications that are pending as well as patents that have already been granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 11281213
Abstract: A system and method for controlling a vehicle (100), the vehicle (100) having an autonomous mode of operation, the system comprising: one or more sensors (102) configured to measure one or more parameters associated with an operator (110) controlling the vehicle (100) to perform a first operation; and one or more processors (106) configured to: determine that measurements taken by the one or more sensors (102) fulfil one or more predetermined criteria; responsive to determining that the measurements taken by the one or more sensors (102) fulfil the one or more predetermined criteria, determine a second operation for performance by the vehicle (100); and control the vehicle (100) in the autonomous mode such that the vehicle (100) performs the second operation.
Type: Grant
Filed: October 18, 2017
Date of Patent: March 22, 2022
Assignee: BAE SYSTEMS plc
Inventors: Gary Martin Cross, Mark Robert Goodall
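The control flow in this abstract can be sketched as a simple decision step: check the operator measurements against the predetermined criteria, and if they all hold, select a second operation and execute it autonomously. This is an illustrative reading only; every name below (`Vehicle`, the workload criterion, `hold_position`) is an assumption, not a detail from the patent.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Vehicle:
    """Minimal stand-in for the controlled vehicle (100)."""
    autonomous_mode: bool = False
    current_operation: Optional[str] = None

    def perform(self, operation: str) -> None:
        # Engage the autonomous mode and carry out the given operation.
        self.autonomous_mode = True
        self.current_operation = operation

def control_step(vehicle: Vehicle,
                 measurements: dict,
                 criteria: List[Callable[[dict], bool]],
                 select_second_operation: Callable[[dict], str]) -> Optional[str]:
    """If every predetermined criterion holds for the sensor
    measurements, determine a second operation and perform it
    autonomously; otherwise leave the operator in control."""
    if all(criterion(measurements) for criterion in criteria):
        op = select_second_operation(measurements)
        vehicle.perform(op)
        return op
    return None

# Example: a hypothetical operator-workload criterion triggers an
# autonomous position hold.
v = Vehicle()
criteria = [lambda m: m["operator_workload"] > 0.8]
result = control_step(v, {"operator_workload": 0.9}, criteria,
                      lambda m: "hold_position")
```

The same step returns `None` (no autonomous takeover) whenever any criterion fails, which matches the abstract's conditional "responsive to determining" structure.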
-
Patent number: 10401175
Abstract: An optical inertial measurement method for determining a 6DOF pose in respect of a moving platform, the method comprising providing, in respect of said moving platform, a camera unit (10) comprised of at least three monocular cameras (14, 16, 18) connected rigidly together and configured such that their fields of view do not overlap and cover motion of said platform in each of three principal, substantially orthogonal axes; receiving video outputs from each of said cameras; determining individual point correspondences from said video outputs, and estimating therefrom, for each camera, a transform that includes translation values representative of direction of translation in the x and y axes, rotation about the optical axis and a scale factor, each transform being expressed with respect to a local 3D coordinate system associated with a respective camera; and mapping said translation values in the x and y axes to a global 3D coordinate system, having its origin defined by a point between said cameras, and multi…
Type: Grant
Filed: January 30, 2015
Date of Patent: September 3, 2019
Assignee: BAE SYSTEMS PLC
Inventors: James Duncan Revell, Mark Robert Goodall, Gary Martin Cross
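The mapping step described above, taking each camera's x/y translation estimates into a shared global frame, can be illustrated with a small least-squares solve: each camera's measured components are projections of the one global translation onto that camera's axes, so stacking three cameras' axes gives an overdetermined linear system. This is a sketch under assumed conventions (rotation matrices mapping camera coordinates to global, columns as axis directions), not the patent's actual formulation.

```python
import numpy as np

def global_translation(orientations, local_xy):
    """orientations: list of 3x3 rotation matrices whose columns are
    each camera's x, y, z axes expressed in the global frame;
    local_xy: per-camera (tx, ty) translation measured along the
    camera's own x and y axes.
    Returns the 3-vector platform translation in the global frame."""
    rows, obs = [], []
    for R, (tx, ty) in zip(orientations, local_xy):
        # A camera's tx measurement is the dot product of its global
        # x-axis direction with the global translation; likewise ty.
        rows.append(R[:, 0]); obs.append(tx)
        rows.append(R[:, 1]); obs.append(ty)
    A = np.stack(rows)
    b = np.array(obs)
    t, *_ = np.linalg.lstsq(A, b, rcond=None)
    return t

# Example: three axis-aligned cameras observing translation (1, 2, 3).
R1 = np.eye(3)
R2 = np.array([[0., 0., 1.], [1., 0., 0.], [0., 1., 0.]])  # x -> e_y, y -> e_z
R3 = np.array([[0., 1., 0.], [0., 0., 1.], [1., 0., 0.]])  # x -> e_z, y -> e_x
t = global_translation([R1, R2, R3], [(1.0, 2.0), (2.0, 3.0), (3.0, 1.0)])
```

Because the three fields of view cover three substantially orthogonal axes, the stacked system has full rank and the translation is recoverable even though each monocular camera alone only sees two of its components.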
-
Publication number: 20190243359
Abstract: A system and method for controlling a vehicle (100), the vehicle (100) having an autonomous mode of operation, the system comprising: one or more sensors (102) configured to measure one or more parameters associated with an operator (110) controlling the vehicle (100) to perform a first operation; and one or more processors (106) configured to: determine that measurements taken by the one or more sensors (102) fulfil one or more predetermined criteria; responsive to determining that the measurements taken by the one or more sensors (102) fulfil the one or more predetermined criteria, determine a second operation for performance by the vehicle (100); and control the vehicle (100) in the autonomous mode such that the vehicle (100) performs the second operation.
Type: Application
Filed: October 18, 2017
Publication date: August 8, 2019
Applicant: BAE SYSTEMS plc
Inventors: Gary Martin Cross, Mark Robert Goodall
-
Patent number: 10262465
Abstract: A mixed reality system for creating a terminal control station enabling communication with and/or control of remote functions and applications, the system comprising a headset (100) for placing over a user's eyes, in use, said headset including a screen (102), the system further comprising a processor (104) configured to receive data from multiple sources and display said data on said screen within a three-dimensional virtual environment, and an input for receiving control information data from an external data source within the real world environment, said processor (104) being further configured to receive image data representative of said external data source and blend said image data into said three-dimensional virtual environment to create a mixed reality environment, including a representation of said external data source and said control information data, to be displayed on said screen (102), the system being configured to allow a user, in use, to manipulate data displayed within said mixed reality env…
Type: Grant
Filed: November 9, 2015
Date of Patent: April 16, 2019
Assignee: BAE Systems plc
Inventors: Julian David Wright, Nicholas Giacomo Robert Colosimo, Christopher James Whiteford, Heather Jean Page, Mark Robert Goodall
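The "blend" step in this abstract, compositing captured imagery of the external data source into the rendered virtual environment, is in essence per-pixel alpha compositing. The sketch below shows that operation only; the array shapes and the idea that the mask selects the external console's pixels are assumptions for illustration, not details from the patent.

```python
import numpy as np

def blend_into_scene(virtual_rgb, captured_rgb, mask):
    """virtual_rgb, captured_rgb: HxWx3 float arrays in [0, 1];
    mask: HxW float alpha in [0, 1], 1.0 where the captured pixel
    (e.g. the external data source's display) should appear in the
    mixed reality environment."""
    a = mask[..., None]  # broadcast the alpha over the colour channels
    return a * captured_rgb + (1.0 - a) * virtual_rgb

# Tiny example: a 2x2 scene where the mask picks out one diagonal.
virtual = np.zeros((2, 2, 3))   # black virtual background
captured = np.ones((2, 2, 3))   # white captured image
mask = np.array([[1.0, 0.0],
                 [0.0, 1.0]])
mixed = blend_into_scene(virtual, captured, mask)
```

In practice the mask would come from segmenting the external data source out of the camera feed; here it is supplied directly to keep the example self-contained.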
-
Publication number: 20170351104
Abstract: Optical apparatus for use with an image capture device having an optical input and an image sensor (22) defining a principal optical axis therebetween, the apparatus being configured to provide, via said input, a plurality of substantially parallel, spaced-apart optical beams to said image sensor (22), and comprising: a first optical unit (26) comprising a plurality of optical elements (10, 12, 14, 16, 18, 20), at least a first one of said optical elements being a first refractive element (12, 14, 18, 20) for refracting an optical beam incident thereon through substantially 90°; and a plurality of focusing lenses (lens0, lens1, lens2, lens3, lens4, lens5), each focusing lens being associated with a respective optical element and being configured to direct a respective incident optical beam thereon; wherein the focusing lens associated with said refractive element is arranged and configured to direct an incident optical beam thereon at substantially 90° to said principal optical axis, and the refractive ele…
Type: Application
Filed: February 4, 2015
Publication date: December 7, 2017
Applicant: BAE SYSTEMS plc
Inventors: James Duncan Revell, Mark Robert Goodall, Gary Martin Cross
-
Publication number: 20170316613
Abstract: A mixed reality system for creating a terminal control station enabling communication with and/or control of remote functions and applications, the system comprising a headset (100) for placing over a user's eyes, in use, said headset including a screen (102), the system further comprising a processor (104) configured to receive data from multiple sources and display said data on said screen within a three-dimensional virtual environment, and an input for receiving control information data from an external data source within the real world environment, said processor (104) being further configured to receive image data representative of said external data source and blend said image data into said three-dimensional virtual environment to create a mixed reality environment, including a representation of said external data source and said control information data, to be displayed on said screen (102), the system being configured to allow a user, in use, to manipulate data displayed within said mixed reality env…
Type: Application
Filed: November 9, 2015
Publication date: November 2, 2017
Inventors: Julian David Wright, Nicholas Giacomo Robert Colosimo, Christopher James Whiteford, Heather Jean Page, Mark Robert Goodall
-
Publication number: 20170307380
Abstract: An optical inertial measurement method for determining a 6DOF pose in respect of a moving platform, the method comprising providing, in respect of said moving platform, a camera unit (10) comprised of at least three monocular cameras (14, 16, 18) connected rigidly together and configured such that their fields of view do not overlap and cover motion of said platform in each of three principal, substantially orthogonal axes; receiving video outputs from each of said cameras; determining individual point correspondences from said video outputs, and estimating therefrom, for each camera, a transform that includes translation values representative of direction of translation in the x and y axes, rotation about the optical axis and a scale factor, each transform being expressed with respect to a local 3D coordinate system associated with a respective camera; and mapping said translation values in the x and y axes to a global 3D coordinate system, having its origin defined by a point between said cameras, and multi…
Type: Application
Filed: January 30, 2015
Publication date: October 26, 2017
Applicant: BAE SYSTEMS plc
Inventors: James Duncan Revell, Mark Robert Goodall, Gary Martin Cross
-
Publication number: 20160371888
Abstract: A system for displaying multiple sources of data within a mixed reality environment, comprising a headset with a visor which is placed over a user's eyes in use. The visor includes a screen and the system further comprises a processor for generating a three-dimensional virtual environment to be displayed on the screen, and within which the data is arranged for viewing by the user. The system includes one or more cameras for capturing real world images in the vicinity of the user, and provides a facility for enabling selected objects from the captured images to be blended and included in the 3D virtual environment displayed on the screen. An extension of the system provides the facility for tailoring a mixed reality environment for a given application.
Type: Application
Filed: March 9, 2015
Publication date: December 22, 2016
Inventors: Julian David Wright, Mark Robert Goodall, Nicholas Giacomo Robert Colosimo, Christopher James Whiteford, Heather Jean Page
-
Publication number: 20150071492
Abstract: Methods and apparatus for determining whether a provided object track (24) is abnormal, an object track (24) being a set of values of a physical property of an object (2) measured over a period of time, the method comprising: providing a model comprising one or more functions (26), each function (26) being representative of an object track (24) that is defined to be normal; assigning the provided object track (24) to a function (26); and comparing the provided object track (24) to the assigned function (26) to determine whether that object track (24) is abnormal. Providing the model comprises: for each of a plurality of objects (2), determining an object track (24), wherein the determined object tracks (24) are defined as normal object tracks (24); and using the determined tracks (24), performing a Gaussian Processes based Variational Bayes Expectation Maximisation process to learn the one or more functions (26).
Type: Application
Filed: April 26, 2013
Publication date: March 12, 2015
Applicant: BAE SYSTEMS plc
Inventors: Jordi McGregor Barr, Yoann Paul Georges Thueux, Mark Robert Goodall
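The classification half of this abstract, assigning a new track to a learned "normal" function and comparing it against that function, can be sketched compactly if each learned function is summarised by a mean track and a per-time variance, as a Gaussian Process posterior would provide. The learning step itself (Variational Bayes Expectation Maximisation over Gaussian Processes) is not reproduced here, and the threshold value is an arbitrary illustrative choice.

```python
import numpy as np

def is_abnormal(track, functions, threshold=4.0):
    """track: 1-D array of a physical property sampled over time;
    functions: list of (mean, variance) array pairs, one pair per
    learned normal function, each the same length as the track.
    Assigns the track to the closest function, then flags it abnormal
    if its average squared standardised deviation from that function
    exceeds the threshold. Returns (abnormal, assigned_index)."""
    scores = [float(np.mean((track - mean) ** 2 / var))
              for mean, var in functions]
    idx = int(np.argmin(scores))       # assignment step
    return scores[idx] > threshold, idx  # comparison step

# Example with one learned normal function: zero mean, unit variance.
functions = [(np.zeros(10), np.ones(10))]
ok, idx = is_abnormal(np.zeros(10), functions)        # on the mean
bad, _ = is_abnormal(np.full(10, 5.0), functions)     # far from it
```

A track lying on the assigned mean scores zero and is accepted, while one that deviates by several standard deviations everywhere exceeds the threshold and is flagged, mirroring the compare-against-assigned-function test the abstract describes.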