Patents by Inventor Mark Bourne Moseley

Mark Bourne Moseley has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 8989876
    Abstract: A method for improving situational awareness for teleoperation of a remote vehicle by creating a 3D map display of an area around the remote vehicle comprises: receiving an original image from a stereo vision camera and utilizing the original image to perform visual odometry to determine the x, y, z, roll, pitch, and yaw for the original image; applying a fill-in algorithm to the original image to fill in an estimated depth for areas of the original image for which no depth data is available, which creates an enhanced depth image; combining the enhanced depth image with the x, y, z, roll, pitch, and yaw for the original image to create the 3D map display of the area around the remote vehicle; and displaying the 3D map display on an operator control unit used to control the remote vehicle.
    Type: Grant
    Filed: May 9, 2014
    Date of Patent: March 24, 2015
    Assignee: iRobot Corporation
    Inventors: Scott Raymond Lenser, Mark Bourne Moseley
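The claims define the patented method; purely to illustrate the two steps named in the abstract (filling in missing depth, then combining the enhanced depth image with the camera's x, y, z, roll, pitch, and yaw), here is a minimal NumPy sketch. The nearest-valid-pixel fill and the Z-Y-X Euler convention are assumptions of this sketch, not details taken from the patent.

```python
import numpy as np

def fill_depth(depth):
    """Fill missing depth values (NaN) with the depth of the nearest
    valid pixel in the same row: a crude stand-in for a fill-in algorithm."""
    filled = depth.copy()
    for row in filled:
        valid = np.flatnonzero(~np.isnan(row))
        missing = np.flatnonzero(np.isnan(row))
        if valid.size == 0 or missing.size == 0:
            continue
        # index of the nearest valid pixel for every missing pixel
        idx = np.abs(missing[:, None] - valid[None, :]).argmin(axis=1)
        row[missing] = row[valid[idx]]
    return filled

def pose_matrix(x, y, z, roll, pitch, yaw):
    """Homogeneous camera-to-world transform from the visual-odometry pose
    (Z-Y-X Euler angles, an assumed convention)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # combined rotation
    T[:3, 3] = [x, y, z]       # translation
    return T
```

Points back-projected from the filled depth image and multiplied by this transform land in a common world frame, which is what lets successive frames accumulate into one 3D map for the operator control unit.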
  • Publication number: 20140247261
    Abstract: A method for improving situational awareness for teleoperation of a remote vehicle by creating a 3D map display of an area around the remote vehicle comprises: receiving an original image from a stereo vision camera and utilizing the original image to perform visual odometry to determine the x, y, z, roll, pitch, and yaw for the original image; applying a fill-in algorithm to the original image to fill in an estimated depth for areas of the original image for which no depth data is available, which creates an enhanced depth image; combining the enhanced depth image with the x, y, z, roll, pitch, and yaw for the original image to create the 3D map display of the area around the remote vehicle; and displaying the 3D map display on an operator control unit used to control the remote vehicle.
    Type: Application
    Filed: May 9, 2014
    Publication date: September 4, 2014
    Applicant: iRobot Corporation
    Inventors: Scott Raymond Lenser, Mark Bourne Moseley
  • Patent number: 8725273
    Abstract: A method for improving situational awareness for teleoperation of a remote vehicle by creating a 3D map display of an area around the remote vehicle comprises: receiving an original image from a stereo vision camera and utilizing the original image to perform visual odometry to determine the x, y, z, roll, pitch, and yaw for the original image; applying a fill-in algorithm to the original image to fill in an estimated depth for areas of the original image for which no depth data is available, which creates an enhanced depth image; combining the enhanced depth image with the x, y, z, roll, pitch, and yaw for the original image to create the 3D map display of the area around the remote vehicle; and displaying the 3D map display on an operator control unit used to control the remote vehicle.
    Type: Grant
    Filed: February 17, 2011
    Date of Patent: May 13, 2014
    Assignee: iRobot Corporation
    Inventors: Scott Raymond Lenser, Mark Bourne Moseley
  • Publication number: 20120290152
Abstract: A method for controlling unmanned vehicles to maintain line-of-sight between a predetermined target and at least one unmanned vehicle. The method comprises: providing an unmanned air vehicle including sensors configured to locate a target and an unmanned ground vehicle including sensors configured to locate and track the target; communicating and exchanging data to and among the unmanned vehicles; controlling the unmanned air vehicle and the unmanned ground vehicle to maintain line-of-sight between a predetermined target and at least one of the unmanned vehicles; geolocating the predetermined target with the unmanned air vehicle using information regarding a position of the unmanned air vehicle and information regarding a position of the target relative to the unmanned air vehicle; and transmitting information defining the geolocation of the predetermined target to the unmanned ground vehicle so that the unmanned ground vehicle can perform path planning based on the geolocation.
    Type: Application
    Filed: July 11, 2012
    Publication date: November 15, 2012
    Inventors: Carol Carlin Cheung, Brian Masao Yamauchi, Christopher Vernon Jones, Mark Bourne Moseley, Sanjiv Singh, Christopher Michael Geyer, Benjamin Peter Grocholsky, Earl Clyde Cox
  • Patent number: 8244469
    Abstract: A collaborative engagement system comprises: at least two unmanned vehicles comprising an unmanned air vehicle including sensors configured to locate a target and an unmanned ground vehicle including sensors configured to locate and track a target; and a controller facilitating control of, and communication and exchange of data to and among the unmanned vehicles, the controller facilitating data exchange via a common protocol. The collaborative engagement system controls the unmanned vehicles to maintain line-of-sight between a predetermined target and at least one of the unmanned vehicles.
    Type: Grant
    Filed: March 16, 2009
    Date of Patent: August 14, 2012
    Assignee: iRobot Corporation
    Inventors: Carol Carlin Cheung, Brian Masao Yamauchi, Christopher Vernon Jones, Mark Bourne Moseley, Sanjiv Singh, Christopher Michael Geyer, Benjamin Peter Grocholsky, Earl Clyde Cox
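A system like the one in this abstract needs some test of whether a given vehicle currently holds line-of-sight to the target. As a rough 2D sketch, assuming point vehicles and circular obstacles (both assumptions of this illustration, not the patented method):

```python
import numpy as np

def has_line_of_sight(pos, target, obstacles):
    """True if the segment pos->target clears every (center, radius)
    circular obstacle: a simple 2D stand-in for a LOS test."""
    p, t = np.asarray(pos, float), np.asarray(target, float)
    d = t - p
    length_sq = d @ d
    for center, radius in obstacles:
        c = np.asarray(center, float)
        # parameter of the point on the segment closest to the obstacle center
        u = 0.0 if length_sq == 0 else np.clip((c - p) @ d / length_sq, 0.0, 1.0)
        if np.linalg.norm(p + u * d - c) < radius:
            return False
    return True

def vehicles_with_los(vehicles, target, obstacles):
    """Names of vehicles that currently hold line-of-sight to the target."""
    return [name for name, pos in vehicles.items()
            if has_line_of_sight(pos, target, obstacles)]
```

A controller could run a check like this each cycle and retask vehicles whenever the list of vehicles holding line-of-sight threatens to become empty.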
  • Publication number: 20110264303
    Abstract: A method for improving situational awareness for teleoperation of a remote vehicle by creating a 3D map display of an area around the remote vehicle comprises: receiving an original image from a stereo vision camera and utilizing the original image to perform visual odometry to determine the x, y, z, roll, pitch, and yaw for the original image; applying a fill-in algorithm to the original image to fill in an estimated depth for areas of the original image for which no depth data is available, which creates an enhanced depth image; combining the enhanced depth image with the x, y, z, roll, pitch, and yaw for the original image to create the 3D map display of the area around the remote vehicle; and displaying the 3D map display on an operator control unit used to control the remote vehicle.
    Type: Application
    Filed: February 17, 2011
    Publication date: October 27, 2011
Inventors: Scott Raymond Lenser, Mark Bourne Moseley
  • Publication number: 20100017046
    Abstract: A collaborative engagement system comprises: at least two unmanned vehicles comprising an unmanned air vehicle including sensors configured to locate a target and an unmanned ground vehicle including sensors configured to locate and track a target; and a controller facilitating control of, and communication and exchange of data to and among the unmanned vehicles, the controller facilitating data exchange via a common protocol. The collaborative engagement system controls the unmanned vehicles to maintain line-of-sight between a predetermined target and at least one of the unmanned vehicles.
    Type: Application
    Filed: March 16, 2009
    Publication date: January 21, 2010
    Inventors: Carol Carlin Cheung, Brian Masao Yamauchi, Christopher Vernon Jones, Mark Bourne Moseley, Sanjiv Singh, Christopher Michael Geyer, Benjamin Peter Grocholsky, Earl Clyde Cox