Patents by Inventor Christopher James Whiteford
Christopher James Whiteford has filed for patents to protect the following inventions. This listing includes both pending patent applications and patents already granted by the United States Patent and Trademark Office (USPTO).
-
Patent number: 10296359
Abstract: A mixed reality control apparatus for a system having at least one remote data source requiring at least one physical control input, the apparatus comprising a headset for placing over a user's eyes, in use, the headset including a screen, the apparatus further including a processor configured to receive data from the at least one remote data source and display the data on the screen within a three-dimensional virtual environment, and image capture means for capturing images of the real world environment in the vicinity of the user, the processor being further configured to: blend at least portions or objects of the images of the real world environment into the three-dimensional virtual environment to create a mixed reality environment, including the data, to be displayed on the screen; and generate a virtual representation of the at least one physical control input and blend the virtual representation into the mixed reality environment at a selected location, and generate a marker representative of the selected …
Type: Grant
Filed: February 15, 2016
Date of Patent: May 21, 2019
Assignee: BAE Systems plc
Inventors: Christopher James Whiteford, Nicholas Giacomo Robert Colosimo, Julian David Wright
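The abstract above describes blending selected portions of captured real-world imagery into a rendered virtual scene. The patent does not disclose an implementation, but the core compositing step can be sketched as a masked alpha blend; the function name, array shapes, and values below are illustrative assumptions, not the patented method.

```python
import numpy as np

def blend_into_virtual(virtual_frame, camera_frame, mask, alpha=1.0):
    """Composite masked real-world pixels into a rendered virtual frame.

    virtual_frame, camera_frame: HxWx3 float arrays (same size, aligned).
    mask: HxW boolean array marking the real-world objects to keep.
    alpha: opacity of the blended real-world pixels (1.0 = fully opaque).
    """
    out = virtual_frame.astype(float).copy()
    m = mask.astype(float)[..., None] * alpha   # HxWx1 per-pixel blend weight
    return out * (1.0 - m) + camera_frame.astype(float) * m

# Hypothetical usage: keep a 2x2 region (e.g. a physical control) from the camera feed
virtual = np.zeros((4, 4, 3))           # rendered virtual scene (black)
camera = np.full((4, 4, 3), 255.0)      # captured real-world frame (white)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True                   # region containing the physical control
mixed = blend_into_virtual(virtual, camera, mask)
```

With `alpha=1.0` the masked region shows the camera feed unchanged; lowering `alpha` would leave the physical control semi-transparent over the virtual scene.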
-
Patent number: 10262465
Abstract: A mixed reality system for creating a terminal control station enabling communication with and/or control of remote functions and applications, the system comprising a headset (100) for placing over a user's eyes, in use, said headset including a screen (102), the system further comprising a processor (104) configured to receive data from multiple sources and display said data on said screen within a three-dimensional virtual environment, and an input for receiving control information data from an external data source within the real world environment, said processor (104) being further configured to receive image data representative of said external data source and blend said image data into said three-dimensional virtual environment to create a mixed reality environment, including a representation of said external data source and said control information data, to be displayed on said screen (102), the system being configured to allow a user, in use, to manipulate data displayed within said mixed reality environment …
Type: Grant
Filed: November 9, 2015
Date of Patent: April 16, 2019
Assignee: BAE Systems plc
Inventors: Julian David Wright, Nicholas Giacomo Robert Colosimo, Christopher James Whiteford, Heather Jean Page, Mark Robert Goodall
-
Patent number: 10216273
Abstract: An apparatus for effecting a control action in respect of a function within a virtual or mixed reality system, the control action corresponding to a predefined bodily movement of an authorized user of said function, wherein an authorized user is defined by a predetermined criterion in respect of a selected body part and/or a passive device carried thereon. The apparatus comprises a detection module for detecting a predefined bodily movement; a multi-spectral imaging system for capturing spectral reflectance and/or emission data at a plurality of wavelengths in respect of said selected body part and/or the passive device carried thereon; and an analysis module for comparing at least a portion of the spectral data with data corresponding to an authorized user of said function to determine if said criterion is met, and outputting a signal to effect said control action only if said criterion is met.
Type: Grant
Filed: February 23, 2016
Date of Patent: February 26, 2019
Assignee: BAE Systems plc
Inventors: Nicholas Giacomo Robert Colosimo, Julian David Wright, Christopher James Whiteford
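The analysis module described above gates a control action on a spectral match against an authorized user. As a rough sketch of that two-stage check (gesture detected AND spectrum matches), the following compares measured reflectance at a few wavelengths against an enrolled profile; the wavelengths, profile values, tolerance, and function names are all illustrative assumptions, not data from the patent.

```python
import numpy as np

# Hypothetical enrolled profile: mean reflectance of the authorized user's
# selected body part at a few sample wavelengths (values illustrative only).
WAVELENGTHS_NM = [450, 550, 650, 850]
ENROLLED_PROFILE = np.array([0.31, 0.42, 0.55, 0.61])
TOLERANCE = 0.05  # maximum mean absolute deviation accepted as a match

def is_authorized(measured_reflectance):
    """Return True if the measured spectrum matches the enrolled profile."""
    measured = np.asarray(measured_reflectance, dtype=float)
    return float(np.mean(np.abs(measured - ENROLLED_PROFILE))) <= TOLERANCE

def effect_control_action(gesture_detected, measured_reflectance):
    """Output a control signal only when both criteria are met."""
    return bool(gesture_detected and is_authorized(measured_reflectance))

# A matching spectrum with a detected gesture fires the action...
print(effect_control_action(True, [0.30, 0.43, 0.54, 0.62]))   # True
# ...but the same gesture from a non-matching hand does not.
print(effect_control_action(True, [0.10, 0.15, 0.20, 0.25]))   # False
```

A real multi-spectral system would compare per-pixel spectra over the imaged body part rather than a single averaged vector, but the gating logic is the same.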
-
Patent number: 10096166
Abstract: Apparatus for selectively displaying an operational area comprising an internal and an external environment separated from one another by at least one physical obstruction, the apparatus comprising a headset including a screen for placing over a user's eyes, the system further comprising a processor configured to generate a three-dimensional virtual environment, and an image capture device for capturing images of the internal environment, said processor being configured to blend image data representative thereof into said three dimensional virtual environment to create a mixed reality environment including a representation of said at least one physical obstruction, the processor configured to receive image data representative of said external environment and to remove at least a portion of said physical obstruction from said mixed reality environment displayed on said screen, and blend said image data of said external environment into said mixed reality environment wherein said physical obstruction appears to be …
Type: Grant
Filed: November 10, 2015
Date of Patent: October 9, 2018
Assignee: BAE Systems plc
Inventors: Julian David Wright, Nicholas Giacomo Robert Colosimo, Christopher James Whiteford
-
Publication number: 20180218631
Abstract: A mixed reality vehicle control system comprising a headset (100) including a screen (102), the system further comprising a processor (104) configured to display images representing virtual control elements (30) within a three dimensional virtual environment on said screen, wherein the system is configured to allow a user to interact with said virtual control elements (30) to control respective vehicle functions or operations, the system further comprising an image capture device for capturing images of the real world environment in the vicinity of the user within the user's field of view, including image data representative of physical control elements (70, 80) therein, and to blend image data representative of at least portions of said user's field of view, including at least one of said physical control elements, into said three dimensional virtual environment to create a mixed reality vehicle control environment.
Type: Application
Filed: November 11, 2015
Publication date: August 2, 2018
Inventors: Julian David Wright, Nicholas Giacomo Robert Colosimo, Christopher James Whiteford
-
Publication number: 20180188806
Abstract: An apparatus for effecting a control action in respect of a function within a virtual or mixed reality system, the control action corresponding to a predefined bodily movement of an authorised user of said function, wherein an authorised user is defined by a predetermined criterion in respect of a selected body part and/or a passive device carried thereon. The apparatus comprises a detection module for detecting a predefined bodily movement; a multi-spectral imaging system for capturing spectral reflectance and/or emission data at a plurality of wavelengths in respect of said selected body part and/or the passive device carried thereon; and an analysis module for comparing at least a portion of the spectral data with data corresponding to an authorised user of said function to determine if said criterion is met, and outputting a signal to effect said control action only if said criterion is met.
Type: Application
Filed: February 23, 2016
Publication date: July 5, 2018
Inventors: Nicholas Giacomo Robert Colosimo, Julian David Wright, Christopher James Whiteford
-
Patent number: 9990738
Abstract: An image processing method and apparatus for determining depth in an original image captured by a light field image capture device, in which a light field analysis algorithm is applied a plurality of times to the original image, changing the focus setting each time, so as to generate a respective plurality of scene images focused at different depths; edge detection is performed in respect of each of the scene images to generate a respective plurality of edge detected images; area identification is performed in respect of each edge detected image to generate a respective plurality of area identification images indicative of areas of respective edge detected images in which edges have been detected; and the area identification images are applied to respective scene images so as to extract from the scene images respective image segments corresponding to the areas in which edges have been detected.
Type: Grant
Filed: February 16, 2016
Date of Patent: June 5, 2018
Assignee: BAE Systems plc
Inventors: Christopher James Whiteford, Nicholas Giacomo Robert Colosimo, Julian David Wright
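The abstract above describes a concrete pipeline: refocus the light field at several depths, edge-detect each refocused image, grow the edges into area masks, and use each mask to extract the in-focus segment at that depth. A minimal sketch of the last three stages is given below, assuming the focal stack has already been rendered; the gradient-magnitude edge detector and 3x3 dilation are simple stand-ins for whatever the patented method actually uses, and all names are illustrative.

```python
import numpy as np

def edge_map(img, threshold=10.0):
    """Gradient-magnitude edge detector (stand-in for e.g. Canny)."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy) > threshold

def segments_by_depth(focal_stack, depths, threshold=10.0):
    """For each refocused image, keep only pixels near in-focus edges.

    focal_stack: list of HxW images, each refocused at depths[i].
    Returns a list of (depth, masked_image) pairs: the image segments
    that are in focus (and hence sharp-edged) at each depth.
    """
    results = []
    for depth, img in zip(depths, focal_stack):
        edges = edge_map(img, threshold)           # edge detection stage
        # "Area identification": dilate edges into regions (3x3 neighbourhood)
        area = edges.copy()
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                area |= np.roll(np.roll(edges, dy, axis=0), dx, axis=1)
        segment = np.where(area, img, 0)           # extract in-focus segment
        results.append((depth, segment))
    return results
```

Objects only produce strong edges in the slice where they are in focus, so each extracted segment is implicitly labelled with its depth, which is the point of the method.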
-
Publication number: 20180032139
Abstract: A mixed reality control apparatus for a system having at least one remote data source requiring at least one physical control input, the apparatus comprising a headset for placing over a user's eyes, in use, the headset including a screen, the apparatus further including a processor configured to receive data from the at least one remote data source and display the data on the screen within a three-dimensional virtual environment, and image capture means for capturing images of the real world environment in the vicinity of the user, the processor being further configured to: blend at least portions or objects of the images of the real world environment into the three-dimensional virtual environment to create a mixed reality environment, including the data, to be displayed on the screen; and generate a virtual representation of the at least one physical control input and blend the virtual representation into the mixed reality environment at a selected location, and generate a marker representative of the selected …
Type: Application
Filed: February 15, 2016
Publication date: February 1, 2018
Inventors: Christopher James Whiteford, Nicholas Giacomo Robert Colosimo, Julian David Wright
-
Publication number: 20180033328
Abstract: A mixed reality vehicle control, e.g. flight, simulator comprising a headset (100) for placing over a user's eyes, in use, said headset including a screen, the simulator further comprising a processor configured to display on said screen a three dimensional environment consisting of virtual scenery, one or more interactive controls (204) for enabling a user (200) to simulate vehicle control actions, said processor being further configured to receive, from said one or more interactive controls, data representative of one or more parameters determinative of vehicle movement and update said scenery displayed on said screen in accordance with said parameters so as to simulate vehicle movement therein.
Type: Application
Filed: February 23, 2016
Publication date: February 1, 2018
Inventors: Christopher James Whiteford, Nicholas Giacomo Robert Colosimo, Julian David Wright
-
Publication number: 20180033157
Abstract: An image processing method and apparatus for determining depth in an original image captured by a light field image capture device, in which a light field analysis algorithm is applied a plurality of times to the original image, changing the focus setting each time, so as to generate a respective plurality of scene images focused at different depths; edge detection is performed in respect of each of the scene images to generate a respective plurality of edge detected images; area identification is performed in respect of each edge detected image to generate a respective plurality of area identification images indicative of areas of respective edge detected images in which edges have been detected; and the area identification images are applied to respective scene images so as to extract from the scene images respective image segments corresponding to the areas in which edges have been detected.
Type: Application
Filed: February 16, 2016
Publication date: February 1, 2018
Inventors: Christopher James Whiteford, Nicholas Giacomo Robert Colosimo, Julian David Wright
-
Publication number: 20170330381
Abstract: Apparatus for selectively displaying an operational area comprising an internal and an external environment separated from one another by at least one physical obstruction, the apparatus comprising a headset including a screen for placing over a user's eyes, the system further comprising a processor configured to generate a three-dimensional virtual environment, and an image capture device for capturing images of the internal environment, said processor being configured to blend image data representative thereof into said three dimensional virtual environment to create a mixed reality environment including a representation of said at least one physical obstruction, the processor configured to receive image data representative of said external environment and to remove at least a portion of said physical obstruction from said mixed reality environment displayed on said screen, and blend said image data of said external environment into said mixed reality environment wherein said physical obstruction appears to be …
Type: Application
Filed: November 10, 2015
Publication date: November 16, 2017
Inventors: Julian David Wright, Nicholas Giacomo Robert Colosimo, Christopher James Whiteford
-
Publication number: 20170316613
Abstract: A mixed reality system for creating a terminal control station enabling communication with and/or control of remote functions and applications, the system comprising a headset (100) for placing over a user's eyes, in use, said headset including a screen (102), the system further comprising a processor (104) configured to receive data from multiple sources and display said data on said screen within a three-dimensional virtual environment, and an input for receiving control information data from an external data source within the real world environment, said processor (104) being further configured to receive image data representative of said external data source and blend said image data into said three-dimensional virtual environment to create a mixed reality environment, including a representation of said external data source and said control information data, to be displayed on said screen (102), the system being configured to allow a user, in use, to manipulate data displayed within said mixed reality environment …
Type: Application
Filed: November 9, 2015
Publication date: November 2, 2017
Inventors: Julian David Wright, Nicholas Giacomo Robert Colosimo, Christopher James Whiteford, Heather Jean Page, Mark Robert Goodall
-
Publication number: 20160371888
Abstract: A system for displaying multiple sources of data within a mixed reality environment, comprising a headset with a visor which is placed over a user's eyes in use. The visor includes a screen and the system further comprises a processor for generating a three-dimensional virtual environment to be displayed on the screen, and within which the data is arranged for viewing by the user. The system includes one or more cameras for capturing real world images in the vicinity of the user, and provides a facility for enabling selected objects from the captured images to be blended and included in the 3D virtual environment displayed on the screen. An extension of the system provides the facility for tailoring a mixed reality environment for a given application.
Type: Application
Filed: March 9, 2015
Publication date: December 22, 2016
Inventors: Julian David Wright, Mark Robert Goodall, Nicholas Giacomo Robert Colosimo, Christopher James Whiteford, Heather Jean Page